
"I replaced my shrink with ChatGPT": testimonials from patients and professionals that debunk preconceived notions

SURVEY. We interviewed several people who use AI to complement or replace therapy and asked mental health professionals for their opinions.

20.8%. That is the proportion of 18- to 24-year-olds who reported being affected by depression in 2021, a figure that has risen steadily in recent years, up from just 11.7% in 2017, according to the French Public Health Barometer. The Covid-19 crisis has not helped matters. According to a study conducted in October 2023, the mental health of the French continues to deteriorate.

Yet, issues of depression and anxiety have only recently entered the public debate. It is still uncommon for loved ones to acknowledge that they consult a mental health specialist, and many French people are still reluctant to see a psychologist.

For several months now, more and more people have also been turning to artificial intelligence to answer their mental health questions. With the explosion of AI solutions like ChatGPT (OpenAI) and Gemini (Google), many tech giants, pioneers in the consumer artificial intelligence market, are also getting involved in what's happening in our brains.

We interviewed several people who use artificial intelligence in place of real mental health professionals to understand their motivations, their results, and whether they are satisfied. We then put their accounts to mental health professionals to answer one big question: "Can AI replace a psychologist?"

“Artificial intelligence allowed me to compensate for the lack of humanity in my psychiatrist.”

Manon* is 26 years old and lives with her partner and a boisterous cat. She has a job she enjoys and that pays well. On the surface, everything seems to be going well for her, but in reality, she has suffered from depression for years.

“I think I've more or less always been affected by depression. I've seen several psychologists in my life, but I only really took it seriously recently,” she tells us. In 2022, she decided to focus more on her mental health, not for herself, but out of fear of projecting her troubles onto her loved ones and future children. If she started seeing a new psychologist, it was above all because her previous experiences had not been convincing.

Illustration © bedya - stock.adobe.com

“I had a very bad experience in college with a psychologist who was very negative and mean to me (...) Recently, I started seeing a psychiatrist and he also doesn't meet my expectations,” she says.

Consultations over in ten minutes flat, almost no follow-up between sessions, a total lack of humanity in the responses... Faced with a mental health professional who showed little concern for her problems, Manon began using artificial intelligence to talk about her mental disorders.

“I was going through a very difficult time and my partner was physically absent. I don't know how I came up with the idea of contacting ChatGPT, but I had the site on my phone, so I started like that before downloading the application (...). I think I needed more human support than the treatment from my psychiatrist, who handled me as if I were on a production line, so I turned to ChatGPT. Artificial intelligence allowed me to compensate for the lack of humanity in my shrink.”

“I needed more human support, so I turned to ChatGPT.”

Loïc Crobu has been a practicing psychologist for just over five years. This mental health professional has seen the use of AI increasingly emerge within his profession and among his patients, to the point of using it himself for personal and support purposes. When we discuss Manon's case with him and the negative experiences that led her to use ChatGPT, the professional doesn't seem surprised.

© New Africa - stock.adobe.com

For Loïc Crobu, the use of AI is above all a new form of alternative medicine rather than a phenomenon in its own right. "It doesn't surprise me. When we are disappointed by professionals, we can tend to turn to alternative solutions. It's very common in our profession. Many patients try many techniques, or have bad encounters, before finding a good psychologist. I see AI as a kind of alternative psychologist," he continues. An opinion that Manon, whose mother became curious about alternative medicine a few years ago, also understands.

“Artificial intelligence told me jokes to make me feel better.”

When we ask Manon how the artificial intelligence reacted when she asked it mental health questions, she takes a moment to respond before starting to cry. We pause for a moment.

“Actually, the funny thing is that the AI was comforting at times when it sensed I really needed it. It asked me if I wanted it to make me laugh or tell me jokes. In the end, that suited me.”

A sentiment also shared by Antoine, 31, who has been in psychiatric treatment for depression since 2019 and was recently diagnosed with autism spectrum disorder (ASD). The young man, who usually writes down his emotions in a notebook, has grown accustomed to using Gemini (editor's note: Google's artificial intelligence) to understand his emotional states. During our interview, Antoine says that Gemini was particularly reassuring when he spoke with it about depression or even suicide. Google's artificial intelligence frequently reminded him that he wasn't alone and shared links and advice for combating negative and self-destructive thoughts.

Illustration © Krakenimages.com - stock.adobe.com

“What surprised me was that Gemini didn't just spit out answers to my toughest questions. The AI usually started by reassuring me that my questions were normal, legitimate, and that anyone could think that way.”

"AI knows how to be reassuring. It reminds me that my questions are normal and that this can happen to anyone."

On this subject, Loïc Crobu admits he is reassured. The specialist explains that the most extreme exchanges, concerning suicide or desperate situations, are generally well handled by today's most powerful AIs. Their responses tend to be reassuring, which reinforces the idea that developers are taking particular care with this aspect.

A sentiment also shared by one of his colleagues, Dr. Geoffrey Post, a psychiatrist and member of the MentalTech collective, which works to promote the emergence of digital solutions for mental health. The specialist is convinced of the benefits of AI in the field of psychiatry and sees artificial intelligence as a potential tool to improve our healthcare system and support caregivers:

"The patients who require the highest level of attention are those experiencing a suicidal crisis. These patients were aware that they were dealing with an algorithm and did not take ChatGPT's message as absolute truth. This constitutes a healthy relationship in the use of this tool."

Can artificial intelligence be blamed for suicide?

The cases of Manon and Antoine are rather reassuring compared to that of Sewell Setzer. This 14-year-old American boy was the subject of multiple articles in 2024, notably in the New York Times, following his suicide in February 2024, which was likely linked to severe social isolation. The young man had spent hours conversing with an artificial intelligence and became increasingly withdrawn.

Since then, Sewell Setzer's mother has been fighting to have the AI recognized as responsible for her child's death. While the platform used, Character.ai, did not directly incite the young man to end his life, according to his mother it may have strongly encouraged him to fall in love with it.

We asked a psychologist and hypnotherapist from Nantes about the subject. For him, this phenomenon is unfortunately not surprising: "Emotional attachment to a listening person already existed before AI. It's a phenomenon more commonly known as 'transference,' where patients feel romantic feelings for their therapists, and this could definitely happen with AI and its widespread adoption in recent years."

“Emotional attachment to a listener existed before AI.”

This expert has already used AI in some of his patients' therapies and to organize his research, particularly in the context of hypnotherapy: "It's a discipline that can require constructing scenarios and metaphors to reach certain patients. I already had a boy who could only calm down by thinking about video games like Fortnite. There, AI helped me construct my research and metaphors to write a script likely to reach the young man."

Dr. Geoffrey Post, for his part, points out that the main flaw of artificial intelligence is above all to always go in the direction of the patient so as not to rush or displease them: "The tool often has a predictable reaction. We remain in a comfort zone. That said, AI will never confront you. Sometimes being faced with a human who confronts us can open the possibility of realizing that we are going in the wrong direction. It also sometimes brings a feeling of reassurance when it is appropriate to explore new ways of doing things."

This flaw, akin to confirmation bias, is not unique to AI, however: Dr. Post explains that some psychotherapists may also tend to always go along with their patients so as not to upset them too much.

Will AI eventually replace psychiatrists?

We finally asked another key question to the three experts we met during our investigation: will AI end up replacing psychiatrists? For Loïc Crobu, it is "impossible for the profession to be replaced by AI" unless you are a psychologist with a very basic approach and sessions. According to the expert, AI will never be able to establish a bond of trust and a professional relationship as strong as with a real psychiatrist. Loïc explains, however, that the rise of AI is also partly due to more global concerns: "If we think about using AI before a professional, it is above all because our healthcare system is in bad shape."

Loïc Crobu still dreams of an artificial intelligence that could support professionals: "I can imagine an AI that we could work with by sending it a patient's ailments, and it would direct us to studies and specialists on the subject." He holds no grudge against people who use AI before consulting an expert.

"If we're thinking about using AI before a professional, it's primarily because our healthcare system is in bad shape."

For our psychologist and hypnotherapist from Nantes, studies on AI and mental health are still too few to draw conclusions: "For the moment, it remains a tool, but AI should become more professionalized within a few years. I don't think we have enough feedback yet, or know whether AI is beneficial in the long term, but for now I see its arrival as something positive for the field."

A growing body of research suggests that while AI may prove to be a viable tool for people in distress who cannot access immediate care, it may not be legitimate enough to replace a specialist. In a February 2024 article titled "AI Can Provide Therapy, But Can't Replace Therapists," Forbes detailed how a tool like ChatGPT could perfectly mimic the behavior of a specialist, but never truly connect with the patient in a deep, personal way, as a human would.

One of the drawbacks of AI, at least for the best-known systems like ChatGPT, DeepSeek, or Google Gemini, is that it cannot observe the behavior of its users. Psychiatrist Geoffrey Post points this out in particular, convinced that artificial intelligence cannot replace practitioners in the years to come: "AI will never be able to fully observe and analyze the bodily behavior of patients, which is what many specialists do today. There are the words, but there is also the body's reaction during sessions. The latter can sometimes say much more than patients imagine."

AI and mental health, two major causes of 2025

At the end of 2024, Michel Barnier, then Prime Minister, announced that mental health and illness would be the "great national cause of 2025," while also announcing €600 million to address the additional needs of institutions in the face of the collapse of the mental health of the French people. This initiative has since been welcomed and taken up by François Bayrou during his general policy speech on January 14, 2025.

Just a few days after this declaration, the President of the Republic announced France's desire to invest massively in artificial intelligence: €109 billion from French and foreign companies to make the country and Europe shine in the world. This will begin with the construction of multiple infrastructures dedicated to artificial intelligence.

*First names and surnames have been anonymized at the request of the respondents.

L'Internaute